[bugfix] Probabilities do not sum to one with Torch #5462
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

```
@@            Coverage Diff             @@
##           master    #5462      +/-   ##
==========================================
- Coverage   99.68%   99.67%   -0.01%
==========================================
  Files         402      402
  Lines       37540    37287     -253
==========================================
- Hits        37420    37166     -254
- Misses        120      121       +1
```

View full report in Codecov by Sentry.
Test?
Looks great! Thanks for taking this on 🚀
Context: When computing the `expval` of an operator using a quantum device with the `torch` interface and `default_dtype` set to `torch.float32`, the probabilities do not sum to one. This error does not occur if `default_dtype` is set to `torch.float64`.

Description of the Change: A renormalization of the probabilities is introduced to overcome the issue. Renormalization occurs whenever the following two conditions are satisfied: 1) at least one probability vector does not sum precisely to one, and 2) for all probability vectors, the deviation from one is between `0` and `1e-07`.

Benefits: The error is no longer raised.

Possible Drawbacks: The main drawback is that renormalization could occur in cases where it should not. This is unlikely, since the cutoff `1e-07` is expected to be small enough to prevent such cases (while large enough that the error is no longer raised).

Related GitHub Issues: #5444
[sc-59957]
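The renormalization rule described in the change can be sketched as follows. This is an illustrative assumption using NumPy, not the actual PennyLane implementation; the helper name `maybe_renormalize` is hypothetical:

```python
import numpy as np

CUTOFF = 1e-7  # deviation tolerance, matching the cutoff discussed in the PR

def maybe_renormalize(probs, cutoff=CUTOFF):
    """Renormalize `probs` when its sum deviates from one by at most `cutoff`.

    Hypothetical helper sketching the rule described above; the real
    PennyLane change may differ in structure and naming.
    """
    total = probs.sum()
    deviation = abs(1.0 - float(total))
    # Condition 1: the probabilities do not sum precisely to one (deviation > 0).
    # Condition 2: the deviation is no larger than the cutoff.
    if 0.0 < deviation <= cutoff:
        return probs / total
    return probs

# Tiny deviations of this kind typically arise from float32 rounding.
probs = np.array([0.5, 0.5 + 5e-8])  # sums to slightly more than one
fixed = maybe_renormalize(probs)      # renormalized, sum is one up to rounding
```

A deviation above the cutoff (e.g. a genuinely invalid distribution) is left untouched, so the existing validation error would still fire in that case.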